absolutely continuous random variable

абсолютно непрерывная случайная величина

English-Russian scientific dictionary. 2008.
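
For reference, the standard definition behind this headword (a hedged addition; it is a fact of probability theory, not part of the dictionary entry itself): a real random variable X is called absolutely continuous when its cumulative distribution function F can be written as the integral of a density f, as in the LaTeX sketch below.

    F(x) = \Pr[X \le x] = \int_{-\infty}^{x} f(t)\,dt,
    \qquad f(t) \ge 0, \qquad \int_{-\infty}^{\infty} f(t)\,dt = 1.

Equivalently, F is an absolutely continuous function, which rules out both discrete atoms (points a with \Pr[X = a] > 0) and singular parts such as the singular functions cited below.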

See what "absolutely continuous random variable" means in other dictionaries:

  • Continuous probability distribution — In probability theory, a probability distribution is called continuous if its cumulative distribution function is continuous. That is equivalent to saying that for a random variable X with the distribution in question, Pr[X = a] = 0 for all a… (Wikipedia; see the numerical sketch after this list)

  • Singular function — The graph of the winding number of the circle map is an example of a singular function. In mathematics, a singular function is any function ƒ defined on the interval [a, b] that has the following properties: ƒ is continuous on [a, b]; there… (Wikipedia)

  • Probability distribution — In probability theory, a probability mass, probability density… (Wikipedia)

  • Expected value — In probability theory, the expected value (or expectation, or mathematical expectation, or mean, or the first moment)… (Wikipedia)

  • Probability density function — In probability theory, a probability density function (pdf), or density of a continuous random variable, is a function that describes the relative likelihood for this… (Wikipedia)

  • Cumulative distribution function — … (Wikipedia)

  • Conditional expectation — In probability theory, a conditional expectation (also known as conditional expected value or conditional mean) is the expected value of a real random variable with respect to a conditional probability distribution. The concept of conditional… (Wikipedia)

  • Kullback–Leibler divergence — In probability theory and information theory, the Kullback–Leibler divergence (also information divergence, information gain, relative entropy, or KLIC) is a non-symmetric measure of the difference between two probability distributions P… (Wikipedia)

  • Information theory and measure theory — Measures in information theory: Many of the formulas in information theory have separate versions for continuous and discrete cases, i.e. integrals for the continuous case and sums for the discrete case. These versions can often be generalized… (Wikipedia)

  • Joint probability distribution — In the study of probability, given two random variables X and Y that are defined on the same probability space, the joint distribution for X and Y defines the probability of events defined in terms of both X and Y. In the case of only two random… (Wikipedia)

  • Characteristic function (probability theory) — The characteristic function of a uniform U(–1,1) random variable is real-valued because it corresponds to a random variable that is symmetric around the origin; however, in the general case, characteristic functions may be complex-valued… (Wikipedia)
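
As a small illustration of the first properties cited above (the cdf as the integral of the pdf, and Pr[X = a] = 0 for a continuous distribution), here is a minimal numerical sketch. It assumes Python with numpy and scipy as an illustrative environment; this choice is an assumption of the sketch, not something taken from the dictionary or the entries above.

    import numpy as np
    from scipy import integrate, stats

    # An absolutely continuous random variable: the standard normal N(0, 1).
    X = stats.norm(loc=0.0, scale=1.0)

    # The cdf is the integral of the pdf: F(x) = integral of f(t) dt over (-inf, x].
    x = 1.5
    area, _ = integrate.quad(X.pdf, -np.inf, x)
    print(area, X.cdf(x))  # both ~ 0.93319

    # A single point carries no probability mass: Pr[X = a] = 0,
    # so F(a + eps) - F(a - eps) shrinks to 0 as eps -> 0.
    a = 0.7
    for eps in (1e-2, 1e-4, 1e-6):
        print(eps, X.cdf(a + eps) - X.cdf(a - eps))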
